
    Unique contributions of perceptual and conceptual humanness to object representations in the human brain

    The human brain is able to quickly and accurately identify objects in a dynamic visual world. Objects evoke different patterns of neural activity in the visual system, which reflect object category memberships. However, the underlying dimensions of object representations in the brain remain unclear. Recent research suggests that objects' similarity to humans is one of the main dimensions used by the brain to organise objects, but the nature of the human-similarity features driving this organisation is still unknown. Here, we investigate the relative contributions of perceptual and conceptual features of humanness to the representational organisation of objects in the human visual system. We collected behavioural judgements of the human-similarity of various objects, which were compared with time-resolved neuroimaging responses to the same objects. The behavioural judgement tasks targeted either perceptual or conceptual humanness features to determine their respective contributions to perceived human-similarity. Behavioural and neuroimaging data revealed significant and unique contributions of both perceptual and conceptual features of humanness, each explaining unique variance in the neuroimaging data. Furthermore, our results showed distinct spatio-temporal dynamics in the processing of conceptual and perceptual humanness features, with later and more lateralised brain responses to conceptual features. This study highlights the critical importance of social requirements in information processing and organisation in the human brain.
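
    The "unique variance" logic described above can be illustrated with a small variance-partitioning sketch in a representational-similarity style. The dissimilarity vectors, variable names, and regression approach below are illustrative assumptions, not the study's actual analysis pipeline:

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical dissimilarity vectors (lower triangles of RDMs) for illustration:
        # one neural RDM and two model RDMs built from humanness judgements.
        n_pairs = 1000
        perceptual = rng.normal(size=n_pairs)
        conceptual = 0.4 * perceptual + rng.normal(size=n_pairs)   # partially correlated models
        neural = 0.5 * perceptual + 0.3 * conceptual + rng.normal(size=n_pairs)

        def r_squared(y, predictors):
            # Ordinary least squares R^2 of y on the given predictor columns plus an intercept.
            X = np.column_stack([np.ones(len(y))] + predictors)
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            return 1 - (y - X @ beta).var() / y.var()

        # Unique variance of each model = R^2 of the full model minus R^2 without that model.
        r2_full = r_squared(neural, [perceptual, conceptual])
        unique_perceptual = r2_full - r_squared(neural, [conceptual])
        unique_conceptual = r2_full - r_squared(neural, [perceptual])
        print(f"unique variance explained by perceptual humanness: {unique_perceptual:.3f}")
        print(f"unique variance explained by conceptual humanness: {unique_conceptual:.3f}")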

    Lateralised dynamic modulations of corticomuscular coherence associated with bimanual learning of rhythmic patterns

    Supplementary Information: The online version contains supplementary material available at https://doi.org/10.1038/s41598-022-10342-5. Human movements are spontaneously attracted to auditory rhythms, triggering an automatic activation of the motor system, a central phenomenon to music perception and production. Cortico-muscular coherence (CMC) in the theta, alpha, beta and gamma frequencies has been used as an index of the synchronisation between cortical motor regions and the muscles. Here we investigated how learning to produce a bimanual rhythmic pattern composed of low- and high-pitch sounds affects CMC in the beta frequency band. Electroencephalography (EEG) and electromyography (EMG) from the left and right First Dorsal Interosseus and Flexor Digitorum Superficialis muscles were concurrently recorded during constant pressure on a force sensor held between the thumb and index finger while listening to the rhythmic pattern before and after a bimanual training session. During the training, participants learnt to produce the rhythmic pattern guided by visual cues by pressing the force sensors with their left or right hand to produce the low- and high-pitch sounds, respectively. Results revealed no changes after training in overall beta CMC or beta oscillation amplitude, nor in the correlation between the left and right sides for EEG and EMG separately. However, correlation analyses indicated that left- and right-hand beta EEG–EMG coherence were positively correlated over time before training but became uncorrelated after training. This suggests that learning to bimanually produce a rhythmic musical pattern reinforces lateralised and segregated cortico-muscular communication. This work was supported by a grant from the Australian Research Council (DP170104322).
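
    The coherence index referred to above can be illustrated with a minimal sketch that computes magnitude-squared EEG-EMG coherence and averages it over a beta band. The synthetic signals, sampling rate, and band limits are assumptions, not the study's recording or analysis parameters:

        import numpy as np
        from scipy.signal import coherence

        fs = 1000                      # assumed sampling rate (Hz)
        t = np.arange(0, 60, 1 / fs)   # 60 s of synthetic data
        rng = np.random.default_rng(1)

        shared_beta = np.sin(2 * np.pi * 20 * t)        # common 20 Hz drive to both signals
        eeg = 0.3 * shared_beta + rng.normal(size=t.size)
        emg = 0.2 * shared_beta + rng.normal(size=t.size)

        # Magnitude-squared coherence, then averaged over an assumed 13-30 Hz beta band.
        f, cxy = coherence(eeg, emg, fs=fs, nperseg=2 * fs)
        beta = (f >= 13) & (f <= 30)
        print(f"mean beta-band EEG-EMG coherence: {cxy[beta].mean():.3f}")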

    Spatial and temporal (non)binding of audiovisual rhythms in sensorimotor synchronisation

    All data are held in a public repository, available at the OSF database (URL access: https://osf.io/2jr48/?view_only=17e3f6f57651418c980832e00d818072). Human movement synchronisation with moving objects strongly relies on visual input. However, auditory information also plays an important role, since real environments are intrinsically multimodal. We used electroencephalography (EEG) frequency tagging to investigate the selective neural processing and integration of visual and auditory information during motor tracking and tested the effects of spatial and temporal congruency between audiovisual modalities. EEG was recorded while participants tracked with their index finger a red flickering (rate fV = 15 Hz) dot oscillating horizontally on a screen. The simultaneous auditory stimulus was modulated in pitch (rate fA = 32 Hz) and lateralised between left and right audio channels to induce perception of a periodic displacement of the sound source. Audiovisual congruency was manipulated in terms of space in Experiment 1 (no motion, same direction or opposite direction), and timing in Experiment 2 (no delay, medium delay or large delay). For both experiments, significant EEG responses were elicited at the fV and fA tagging frequencies. It was also hypothesised that intermodulation products corresponding to the nonlinear integration of visual and auditory stimuli at frequencies fV ± fA would be elicited, due to audiovisual integration, especially in Congruent conditions. However, these components were not observed. Moreover, synchronisation and EEG results were not influenced by the congruency manipulations, which invites further exploration of the conditions which may modulate audiovisual processing and the motor tracking of moving objects. We thank Ashleigh Clibborn and Ayah Hammoud for their assistance with data collection. This work was supported by a grant from the Australian Research Council (DP170104322, DP220103047). OML is supported by the Portuguese Foundation for Science and Technology and the Portuguese Ministry of Science, Technology and Higher Education, through the national funds, within the scope of the Transitory Disposition of the Decree No. 57/2016, of 29 August, amended by Law No. 57/2017 of 19 July (Ref.: SFRH/BPD/72710/2010).
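
    The intermodulation logic mentioned above can be made concrete with a short sketch: a multiplicative nonlinearity applied to two tagged signals produces spectral components at fA - fV and fA + fV. The signals and the form of the nonlinearity are assumptions for illustration, not a model of the recorded EEG:

        import numpy as np

        fs = 512
        t = np.arange(0, 20, 1 / fs)
        fV, fA = 15.0, 32.0            # tagging frequencies from the abstract

        visual = np.sin(2 * np.pi * fV * t)
        auditory = np.sin(2 * np.pi * fA * t)
        # A simple multiplicative nonlinearity stands in for the hypothesised
        # (but here not observed) audiovisual integration stage.
        response = visual + auditory + 0.5 * visual * auditory

        amp = np.abs(np.fft.rfft(response)) / t.size
        freqs = np.fft.rfftfreq(t.size, 1 / fs)
        for target in (fV, fA, fA - fV, fA + fV):
            idx = np.argmin(np.abs(freqs - target))
            print(f"{target:5.1f} Hz -> amplitude {amp[idx]:.3f}")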

    Getting your sea legs

    Sea travel mandates changes in the control of the body. The process by which we adapt bodily control to life at sea is known as getting one's sea legs. We conducted the first experimental study of bodily control as maritime novices adapted to the motion of a ship at sea. We evaluated postural activity (stance width, stance angle, and the kinematics of body sway) before and during a sea voyage. In addition, we evaluated the role of the visible horizon in the control of body sway. Finally, we related data on postural activity to two subjective experiences associated with sea travel: seasickness and mal de debarquement. Our results revealed rapid changes in postural activity among novices at sea. Before the beginning of the voyage, the temporal dynamics of body sway differed among participants as a function of their (subsequent) severity of seasickness. Body sway measured at sea differed among participants as a function of their (subsequent) experience of mal de debarquement. We discuss implications of these results for general theories of the perception and control of bodily orientation, for the etiology of motion sickness, and for general phenomena of perceptual-motor adaptation and learning.

    Laparoscopic extravesical ureteral reimplantation (LEVUR): A systematic review

    BACKGROUND: Laparoscopic ureteral reimplantation is a feasible method for treating ureteral pathology, with good preliminary results in the literature. In this study, we review medium-term results of laparoscopic ureteral reimplantation and discuss current developments of this procedure. METHODS: The Medline and Embase databases were searched using relevant key terms to identify reports of paediatric laparoscopic extravesical ureteral reimplantation (LEVUR). Literature reviews, case reports, and series including patients older than 20 years were excluded. RESULTS: Five studies were assessed; overall, 69 LEVUR procedures were performed in children. Although the surgical technique differed between studies, the extravesical approach was respected in all cases. Patient demographics, preoperative symptoms, radiological imaging, complications, and postoperative outcomes were analyzed. The median success rate was 96%. Complications were reported in five cases. CONCLUSIONS: This study is limited by the data given in the individual series: varied criteria used for patient selection and outcome, as well as inconsistent pre- and post-operative imaging data, precluded a meta-analysis. However, it demonstrates that laparoscopic ureteral reimplantation is an effective procedure with good medium-term results. We believe that in well-selected patients this procedure will become an established treatment option.

    Dynamic modulation of beta band cortico-muscular coupling induced by audio-visual rhythms

    Human movements often spontaneously fall into synchrony with auditory and visual environmental rhythms. Related behavioral studies have shown that motor responses are automatically and unintentionally coupled with external rhythmic stimuli. However, the neurophysiological processes underlying such motor entrainment remain largely unknown. Here we investigated with electroencephalography (EEG) and electromyography (EMG) the modulation of neural and muscular activity induced by periodic audio and/or visual sequences. The sequences were presented at either 1 Hz or 2 Hz while participants maintained constant finger pressure on a force sensor. The results revealed that although there was no change of amplitude in participants’ EMG in response to the sequences, the synchronization between EMG and EEG recorded over motor areas in the beta (12–40 Hz) frequency band was dynamically modulated, with maximal coherence occurring about 100 ms before each stimulus. These modulations in beta EEG–EMG motor coherence were found for the 2 Hz audio-visual sequences, confirming at a neurophysiological level the enhancement of motor entrainment with multimodal rhythms that fall within preferred perceptual and movement frequency ranges. Our findings identify beta band cortico-muscular coupling as a potential underlying mechanism of motor entrainment, further elucidating the nature of the link between sensory and motor systems in humans.
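
    The time-resolved coupling described above can be approximated by computing coherence across trials at each time point after band-pass filtering epochs aligned to the stimuli. The sketch below does this on synthetic data containing a shared beta burst placed about 100 ms before onset; the filter settings, trial structure, and estimator are illustrative assumptions, not the study's analysis:

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        fs = 1000
        n_trials, n_samples = 100, fs          # 1 s epochs around each stimulus
        t = np.arange(n_samples) / fs - 0.5    # time relative to stimulus onset (s)
        rng = np.random.default_rng(2)

        # Shared 20 Hz burst shortly before stimulus onset (~-100 ms), plus independent noise.
        burst = np.exp(-((t + 0.1) ** 2) / (2 * 0.05 ** 2)) * np.sin(2 * np.pi * 20 * t)
        eeg = 0.6 * burst + rng.normal(size=(n_trials, n_samples))
        emg = 0.6 * burst + rng.normal(size=(n_trials, n_samples))

        b, a = butter(4, [12, 40], btype="bandpass", fs=fs)
        eeg_a = hilbert(filtfilt(b, a, eeg, axis=1), axis=1)   # analytic beta-band signals
        emg_a = hilbert(filtfilt(b, a, emg, axis=1), axis=1)

        # Coherence across trials at each time point (event-related coherence).
        cross = np.mean(eeg_a * np.conj(emg_a), axis=0)
        coh = np.abs(cross) / np.sqrt(np.mean(np.abs(eeg_a) ** 2, axis=0)
                                      * np.mean(np.abs(emg_a) ** 2, axis=0))
        print(f"peak EEG-EMG coherence at t = {t[np.argmax(coh)] * 1000:.0f} ms")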

    Does movement amplitude of a co-performer affect individual performance in musical synchronization?

    Interpersonal coordination in musical ensembles often involves multisensory cues, with visual information about body movements supplementing co-performers’ sounds. Previous research on the influence of movement amplitude of a visual stimulus on basic sensorimotor synchronization has shown mixed results. Uninstructed visuomotor synchronization seems to be influenced by amplitude of a visual stimulus, but instructed visuomotor synchronization is not. While music performance presents a special case of visually mediated coordination, involving both uninstructed (spontaneously coordinating ancillary body movements with co-performers) and instructed (producing sound on a beat) forms of synchronization, the underlying mechanisms might also support rhythmic interpersonal coordination in the general population. We asked whether visual cue amplitude would affect nonmusicians’ synchronization of sound and head movements in a musical drumming task designed to be accessible regardless of musical experience. Given the mixed prior results, we considered two competing hypotheses. H1: higher amplitude visual cues will improve synchronization. H2: different amplitude visual cues will have no effect on synchronization. Participants observed a human-derived motion capture avatar with three levels of movement amplitude, or a still image of the avatar, while drumming along to the beat of tempo-changing music. The moving avatars were always timed to match the music. We measured temporal asynchrony (drumming relative to the music), predictive timing, ancillary movement fluctuation, and cross-spectral coherence of ancillary movements between the participant and avatar. The competing hypotheses were tested using conditional equivalence testing. This method involves using a statistical equivalence test in the event that standard hypothesis tests show no differences. Our results showed no statistical differences across visual cue types. Therefore, we conclude that there is not a strong effect of visual stimulus amplitude on instructed synchronization.
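
    The conditional equivalence-testing procedure described above can be sketched as two one-sided tests (TOST) run only when a standard paired test shows no difference. The data, equivalence bound, and alpha level below are arbitrary illustrations, not the study's values:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        async_low = rng.normal(30, 10, size=40)            # mean asynchrony (ms), low-amplitude cue
        async_high = async_low + rng.normal(0, 5, size=40) # high-amplitude cue, no true effect
        diff = async_high - async_low
        bound = 10.0                                       # equivalence bound in ms (assumed)

        t_std, p_std = stats.ttest_rel(async_high, async_low)
        # TOST: the mean difference must be significantly greater than -bound AND less than +bound.
        _, p_lower = stats.ttest_1samp(diff, -bound, alternative="greater")
        _, p_upper = stats.ttest_1samp(diff, bound, alternative="less")
        p_tost = max(p_lower, p_upper)

        print(f"standard paired test: p = {p_std:.3f}")
        if p_std >= 0.05:   # conditional step: only test equivalence if no difference was found
            verdict = "equivalent within bound" if p_tost < 0.05 else "inconclusive"
            print(f"TOST: p = {p_tost:.3f} ({verdict})")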

    Neural tracking of visual periodic motion

    Periodicity is a fundamental property of biological systems, including human movement systems. Periodic movements support displacements of the body in the environment as well as interactions and communication between individuals. Here, we use electroencephalography (EEG) to investigate the neural tracking of visual periodic motion, and more specifically, the relevance of spatiotemporal information contained at and between their turning points. We compared EEG responses to visual sinusoidal oscillations versus nonlinear Rayleigh oscillations, which are both typical of human movements. These oscillations contain the same spatiotemporal information at their turning points but differ between turning points, with Rayleigh oscillations having an earlier peak velocity, shown to increase an individual's capacity to produce accurately synchronized movements. EEG analyses highlighted the relevance of spatiotemporal information between the turning points by showing that the brain precisely tracks subtle differences in velocity profiles, as indicated by earlier EEG responses for Rayleigh oscillations. The results suggest that the brain is particularly responsive to velocity peaks in visual periodic motion, supporting their role in conveying behaviorally relevant timing information at a neurophysiological level. The results also suggest key functions of neural oscillations in the alpha and beta frequency bands, particularly in the right hemisphere. Together, these findings provide insights into the neural mechanisms underpinning the processing of visual periodic motion and the critical role of velocity peaks in enabling proficient visuomotor synchronization.
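
    The difference in velocity profiles referred to above can be illustrated by numerically integrating a generic Rayleigh oscillator and locating peak speed within each half-cycle (exactly 0.5 of the half-cycle for a sinusoid). The equation, parameters, and frequency here are assumptions for illustration, not the study's stimulus construction:

        import numpy as np
        from scipy.integrate import solve_ivp

        mu = 1.0   # strength of the Rayleigh nonlinearity (assumed)

        def rayleigh(t, y):
            x, v = y
            # Rayleigh equation: x'' - mu*(x' - x'**3/3) + x = 0
            return [v, mu * (v - v**3 / 3) - x]

        t = np.linspace(0, 60, 60001)
        sol = solve_ivp(rayleigh, (0, 60), [1.0, 0.0], t_eval=t, rtol=1e-8, atol=1e-10)
        keep = t > 30                        # discard the transient, keep the limit cycle
        v, tt = sol.y[1][keep], t[keep]

        def peak_speed_fraction(tt, v):
            # Fraction of each half-cycle (turning point to turning point) elapsed when
            # speed |v| peaks; 0.5 for a pure sinusoid.
            turns = np.where(np.diff(np.sign(v)) != 0)[0]
            fractions = []
            for a, b in zip(turns[:-1], turns[1:]):
                peak = a + np.argmax(np.abs(v[a:b + 1]))
                fractions.append((tt[peak] - tt[a]) / (tt[b] - tt[a]))
            return np.mean(fractions)

        print(f"sinusoid: peak speed at {peak_speed_fraction(tt, np.cos(tt)):.2f} of the half-cycle")
        print(f"Rayleigh: peak speed at {peak_speed_fraction(tt, v):.2f} of the half-cycle (earlier)")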

    Neural tracking and integration of 'self' and 'other' in improvised interpersonal coordination

    Humans coordinate their movements with one another in a range of everyday activities and skill domains. Optimal joint performance requires the continuous anticipation of and adaptation to each other's movements, especially when actions are spontaneous rather than pre-planned. Here we employ dual-EEG and frequency-tagging techniques to investigate how the neural tracking of self- and other-generated movements supports interpersonal coordination during improvised motion. LEDs flickering at 5.7 and 7.7 Hz were attached to participants’ index fingers in 28 dyads as they produced novel patterns of synchronous horizontal forearm movements. EEG responses at these frequencies revealed enhanced neural tracking of self-generated movement when leading and of other-generated movements when following. A marker of self-other integration at 13.4 Hz (the intermodulation frequency of 5.7 and 7.7 Hz) peaked when no leader was designated and mutual adaptation and movement synchrony were maximal. Furthermore, the amplitude of EEG responses reflected differences in the capacity of dyads to synchronize their movements, offering a neurophysiologically grounded perspective for understanding perceptual-motor mechanisms underlying joint action.
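
    A common readout in this kind of frequency-tagging design is the spectral amplitude at the tagged and intermodulation frequencies expressed relative to neighbouring bins. The sketch below applies this to a synthetic EEG-like signal; the frequencies follow the abstract, while the durations, amplitudes, and SNR definition are assumptions:

        import numpy as np

        fs, duration = 256, 60
        t = np.arange(0, duration, 1 / fs)
        rng = np.random.default_rng(4)
        f_self, f_other = 5.7, 7.7
        f_im = f_self + f_other            # 13.4 Hz intermodulation component

        signal = (0.8 * np.sin(2 * np.pi * f_self * t)
                  + 0.6 * np.sin(2 * np.pi * f_other * t)
                  + 0.3 * np.sin(2 * np.pi * f_im * t)
                  + rng.normal(scale=2.0, size=t.size))

        amp = np.abs(np.fft.rfft(signal)) / t.size
        freqs = np.fft.rfftfreq(t.size, 1 / fs)

        def snr(target, n_neighbours=10):
            # Amplitude at the target frequency divided by the mean of surrounding bins,
            # excluding the target bin and its immediate neighbours.
            i = np.argmin(np.abs(freqs - target))
            neigh = np.r_[amp[i - n_neighbours:i - 1], amp[i + 2:i + n_neighbours + 1]]
            return amp[i] / neigh.mean()

        for f in (f_self, f_other, f_im):
            print(f"{f:5.1f} Hz  SNR = {snr(f):.1f}")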

    Neural tracking of the musical beat is enhanced by low-frequency sounds

    Music makes us move, and using bass instruments to build the rhythmic foundations of music is especially effective at inducing people to dance to periodic pulse-like beats. Here, we show that this culturally widespread practice may exploit a neurophysiological mechanism whereby low-frequency sounds shape the neural representations of rhythmic input by boosting selective locking to the beat. Cortical activity was captured using electroencephalography (EEG) while participants listened to a regular rhythm or to a relatively complex syncopated rhythm conveyed either by low tones (130 Hz) or high tones (1236.8 Hz). We found that cortical activity at the frequency of the perceived beat is selectively enhanced compared with other frequencies in the EEG spectrum when rhythms are conveyed by bass sounds. This effect is unlikely to arise from early cochlear processes, as revealed by auditory physiological modeling, and was particularly pronounced for the complex rhythm requiring endogenous generation of the beat. The effect is likewise not attributable to differences in perceived loudness between low and high tones, as a control experiment manipulating sound intensity alone did not yield similar results. Finally, the privileged role of bass sounds is contingent on allocation of attentional resources to the temporal properties of the stimulus, as revealed by a further control experiment examining the role of a behavioral task. Together, our results provide a neurobiological basis for the convention of using bass instruments to carry the rhythmic foundations of music and to drive people to move to the beat.
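
    One way to see why a syncopated rhythm "requires endogenous generation of the beat" is to quantify how much energy the stimulus envelope itself carries at the beat frequency. The two 12-step patterns below are hypothetical examples, not the rhythms used in the study, and the grid and sampling parameters are likewise assumed:

        import numpy as np

        fs = 100                      # envelope sampling rate (Hz), assumed
        grid = 0.2                    # seconds per grid step, assumed
        beat_every = 4                # a beat every 4 grid steps -> beat frequency 1.25 Hz
        regular    = [1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0]
        syncopated = [1, 0, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0]   # events mostly off the beat

        def beat_amplitude(pattern, n_repeats=32):
            # Build an impulse envelope for the repeated pattern and read out the FFT
            # amplitude at the beat frequency.
            step = np.zeros(int(grid * fs))
            step[0] = 1.0
            env = np.concatenate([step * onset for onset in pattern * n_repeats])
            amp = np.abs(np.fft.rfft(env)) / env.size
            freqs = np.fft.rfftfreq(env.size, 1 / fs)
            beat_freq = 1 / (beat_every * grid)
            return amp[np.argmin(np.abs(freqs - beat_freq))]

        print(f"beat-frequency amplitude, regular rhythm:    {beat_amplitude(regular):.4f}")
        print(f"beat-frequency amplitude, syncopated rhythm: {beat_amplitude(syncopated):.4f}")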